
    Deep Chronnectome Learning via Full Bidirectional Long Short-Term Memory Networks for MCI Diagnosis

    Brain functional connectivity (FC) extracted from resting-state fMRI (RS-fMRI) has become a popular approach for disease diagnosis, where discriminating subjects with mild cognitive impairment (MCI) from normal controls (NC) remains one of the most challenging problems. Dynamic functional connectivity (dFC), consisting of time-varying spatiotemporal dynamics, may characterize "chronnectome" diagnostic information for improving MCI classification. However, most current dFC studies detect a few discrete major brain states via spatial clustering, which ignores the rich spatiotemporal dynamics contained in the chronnectome. We propose Deep Chronnectome Learning to exhaustively mine this information, especially the hidden higher-level features of the dFC time series, which may add critical diagnostic power for MCI classification. To this end, we devise a new Fully-connected Bidirectional Long Short-Term Memory network (Full-BiLSTM) that learns the periodic brain status changes using both past and future information for each brief time segment and then fuses them to form the final output. We applied our method to a rigorously built large-scale multi-site database (164 samples from NCs and 330 from MCIs, further augmented 25-fold). Our method outperforms other state-of-the-art approaches with an accuracy of 73.6% under solid cross-validation. We also made extensive comparisons among multiple variants of LSTM models. The results suggest that our method is highly feasible, with promising value for other brain-disorder diagnoses as well. Comment: The paper has been accepted by MICCAI 2018.
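
    A minimal sketch (not the authors' released code) of the Full-BiLSTM idea: a bidirectional LSTM over windowed dFC vectors whose per-window outputs are all fused before a final classifier. The feature dimension, number of windows, hidden size, and the flatten-style fusion are illustrative assumptions.

```python
# Hedged sketch of a "fully connected" bidirectional LSTM for dFC sequences.
import torch
import torch.nn as nn

class FullBiLSTM(nn.Module):
    def __init__(self, n_features=1431, n_windows=30, hidden=128, n_classes=2):
        super().__init__()
        self.bilstm = nn.LSTM(input_size=n_features, hidden_size=hidden,
                              batch_first=True, bidirectional=True)
        # Fuse the per-window outputs (past + future context) from all
        # time steps into one vector before classification.
        self.classifier = nn.Linear(2 * hidden * n_windows, n_classes)

    def forward(self, x):
        # x: (batch, n_windows, n_features) -- one dFC vector per sliding window
        out, _ = self.bilstm(x)            # (batch, n_windows, 2*hidden)
        fused = out.flatten(start_dim=1)   # concatenate all window outputs
        return self.classifier(fused)      # (batch, n_classes) logits

if __name__ == "__main__":
    model = FullBiLSTM()
    dfc = torch.randn(4, 30, 1431)         # 4 subjects of random stand-in data
    print(model(dfc).shape)                # torch.Size([4, 2])
```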

    Polar methane accumulation and rainstorms on Titan from simulations of the methane cycle

    Titan has a methane cycle akin to Earth's water cycle. It has lakes in polar regions, preferentially in the north; dry low latitudes with fluvial features and occasional rainstorms; and tropospheric clouds mainly (so far) in southern middle latitudes and polar regions. Previous models have explained the low-latitude dryness as a result of atmospheric methane transport into middle and high latitudes. Hitherto, no model has explained why lakes are found only in polar regions and preferentially in the north; how low-latitude rainstorms arise; or why clouds cluster in southern middle and high latitudes. Here we report simulations with a three-dimensional atmospheric model coupled to a dynamic surface reservoir of methane. We find that methane is cold-trapped and accumulates in polar regions, preferentially in the north because the northern summer, at aphelion, is longer and has greater net precipitation than the southern summer. The net precipitation in polar regions is balanced in the annual mean by slow along-surface methane transport towards mid-latitudes, and subsequent evaporation. In low latitudes, rare but intense storms occur around the equinoxes, producing enough precipitation to carve surface features. Tropospheric clouds form primarily in middle and high latitudes of the summer hemisphere, which until recently has been the southern hemisphere. We predict that in the northern polar region, prominent clouds will form within about two (Earth) years and lake levels will rise over the next fifteen years.
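
    A toy illustration only (not the 3-D model used in the paper) of the annual-mean balance described above: in steady state, the along-surface export from a polar reservoir must equal the net precipitation into it. All rates are made-up placeholder numbers.

```python
# Toy polar methane budget: export balances net precipitation at steady state.
polar_store = 0.0        # surface methane in the polar reservoir (arbitrary units)
net_precip = 1.0         # assumed polar net precipitation per Titan year
transport_coeff = 0.5    # assumed fraction of the store exported per year

for year in range(50):
    export = transport_coeff * polar_store   # along-surface flow, later evaporated
    polar_store += net_precip - export

# After spin-up the export approaches net_precip and the store levels off
# at net_precip / transport_coeff.
print(round(polar_store, 3), round(transport_coeff * polar_store, 3))
```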

    Neural NILM: Deep Neural Networks Applied to Energy Disaggregation

    Energy disaggregation estimates appliance-by-appliance electricity consumption from a single meter that measures the whole home's electricity demand. Recently, deep neural networks have driven remarkable improvements in classification performance in neighbouring machine learning fields such as image classification and automatic speech recognition. In this paper, we adapt three deep neural network architectures to energy disaggregation: 1) a form of recurrent neural network called `long short-term memory' (LSTM); 2) denoising autoencoders; and 3) a network which regresses the start time, end time and average power demand of each appliance activation. We use seven metrics to test the performance of these algorithms on real aggregate power data from five appliances. Tests are performed against a house not seen during training and against houses seen during training. We find that all three neural nets achieve better F1 scores (averaged over all five appliances) than either combinatorial optimisation or factorial hidden Markov models, and that our neural net algorithms generalise well to an unseen house. Comment: To appear in ACM BuildSys'15, November 4--5, 2015, Seoul, South Korea.
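
    A rough sketch of the denoising-autoencoder variant described above: a window of the aggregate mains signal goes in as the "noisy" input, and the target appliance's power over the same window is regressed as the "clean" output. The window length and layer sizes are assumptions, not the paper's exact architecture.

```python
# Hedged denoising-autoencoder-style disaggregator for one target appliance.
import torch
import torch.nn as nn

WINDOW = 599  # samples per input window (illustrative)

class DAEDisaggregator(nn.Module):
    def __init__(self, window=WINDOW, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=4, padding=2),   # light front-end filter
            nn.Flatten(),
            # Conv1d with this padding yields window + 1 output samples per filter.
            nn.Linear(8 * (window + 1), hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, window),                    # appliance power window
        )

    def forward(self, x):
        # x: (batch, 1, window) normalised aggregate power
        return self.net(x)

if __name__ == "__main__":
    model = DAEDisaggregator()
    mains = torch.randn(16, 1, WINDOW)
    print(model(mains).shape)   # torch.Size([16, 599])
```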

    The Australian Orthopaedic Association National Joint Replacement Registry

    The document attached has been archived with permission from the editor of the Medical Journal of Australia. An external link to the publisher's copy is included. In the financial year ending June 2002, 26 689 hip replacements and 26 089 knee replacements (total, 52 778) were performed in Australia. Hip and knee replacement procedures have increased by 5%-10% each year for the past 10 years, with a combined increase in hip and knee replacements of 13.4% in the past year. The revision rate for hip replacement surgery in Australia is unknown but is estimated to be 20%-24%; the revision rate for hip replacement surgery in Sweden is 7%. Although data collection for the Registry is voluntary, it has 100% compliance from hospitals undertaking joint-replacement surgery. Stephen E Graves, David Davidson, Lisa Ingerson, Philip Ryan, Elizabeth C Griffith, Brian F J McDermott, Heather J McElroy and Nicole L Pratt

    New Orientia tsutsugamushi strain from scrub typhus in Australia.

    In a recent case of scrub typhus in Australia, Orientia tsutsugamushi isolated from the patient's blood was tested by sequence analysis of the 16S rDNA gene. The sequence showed a strain of O. tsutsugamushi that was quite different from the classic Karp, Kato, and Gilliam strains. The new strain has been designated Litchfield.

    An Anisotropic Wormhole: Tunnelling in Time and Space

    We discuss the structure of a gravitational euclidean instanton obtained through coupling of gravity to electromagnetism. Its topology at fixed $t$ is $S^1 \times S^2$. This euclidean solution can be interpreted as a tunnelling to a hyperbolic space (baby universe) at $t=0$, or alternatively as a static wormhole that joins the two asymptotically flat spaces of a Reissner-Nordström type solution with $M=0$. Comment: PLAIN-TEX, 16 pages (4 figures not included), Report DFTT 2/9
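
    For orientation only: the textbook Reissner-Nordström line element with the mass set to zero (in units $G=c=1$), which is the background the abstract refers to; the paper's anisotropic instanton itself is a variant of this and is not reproduced here.

```latex
% Standard Reissner--Nordstrom form with M = 0 (textbook fact, shown for context):
ds^2 = -\Big(1 + \frac{Q^2}{r^2}\Big)\,dt^2
       + \Big(1 + \frac{Q^2}{r^2}\Big)^{-1} dr^2
       + r^2\big(d\theta^2 + \sin^2\theta\, d\phi^2\big)
```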

    Effect of Statistical Fluctuation in Monte Carlo Based Photon Beam Dose Calculation on Gamma Index Evaluation

    The gamma-index test is commonly used to quantify the degree of agreement between a reference dose distribution and an evaluation dose distribution. Monte Carlo (MC) simulation is widely used for radiotherapy dose calculation for both clinical and research purposes. The goal of this work is to investigate, both theoretically and experimentally, the impact of MC statistical fluctuation on the gamma-index test when the fluctuation exists in the reference dose distribution, the evaluation dose distribution, or both. To first-order approximation, we demonstrate theoretically in a simplified model that the statistical fluctuation tends to overestimate gamma-index values when it exists in the reference dose distribution and to underestimate them when it exists in the evaluation dose distribution, provided the original gamma-index is large relative to the statistical fluctuation. Our numerical experiments using clinical photon radiation therapy cases show that 1) when performing a gamma-index test between an MC reference dose and a non-MC evaluation dose, the average gamma-index is overestimated and the passing rate decreases as the noise level in the reference dose increases; 2) when performing a gamma-index test between a non-MC reference dose and an MC evaluation dose, the average gamma-index is underestimated when the gamma-index values are within the clinically relevant range, and the passing rate increases as the noise level in the evaluation dose increases; 3) when performing a gamma-index test between an MC reference dose and an MC evaluation dose, the passing rate is overestimated due to the noise in the evaluation dose and underestimated due to the noise in the reference dose. We conclude that the gamma-index test should be used with caution when comparing dose distributions computed with Monte Carlo simulation.
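
    To make the quantity concrete, here is a brute-force 1-D gamma-index sketch. The 3%/3 mm criteria, the toy Gaussian profile, and the noise level are assumptions for illustration; adding noise to the reference or evaluation profile lets one qualitatively reproduce the over- and under-estimation trends discussed above.

```python
# Hedged 1-D gamma-index illustration with MC-like noise in the evaluation dose.
import numpy as np

def gamma_index(dose_ref, dose_eval, x, dose_tol=0.03, dist_tol=3.0):
    """Gamma at each reference point; x is the position grid in mm."""
    norm = dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (xr, dr) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xr) / dist_tol) ** 2                 # distance-to-agreement term
        diff2 = ((dose_eval - dr) / (dose_tol * norm)) ** 2  # dose-difference term
        gammas[i] = np.sqrt((dist2 + diff2).min())
    return gammas

x = np.linspace(-50, 50, 201)                              # mm
dose_ref = np.exp(-x**2 / (2 * 20**2))                     # toy Gaussian "beam profile"
dose_eval = dose_ref + np.random.normal(0, 0.01, x.size)   # noisy evaluation dose
gamma = gamma_index(dose_ref, dose_eval, x)
print("pass rate:", (gamma <= 1).mean())
```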

    Compact x-ray source based on burst-mode inverse Compton scattering at 100 kHz

    A design for a compact x-ray light source (CXLS) with flux and brilliance orders of magnitude beyond existing laboratory-scale sources is presented. The source is based on inverse Compton scattering of a high-brightness electron bunch on a picosecond laser pulse. The accelerator is a novel high-efficiency standing-wave linac and RF photoinjector powered by a single ultrastable RF transmitter at X-band RF frequency. The high efficiency permits operation at repetition rates up to 1 kHz, which is further boosted to 100 kHz by operating with trains of 100 bunches of 100 pC charge, each separated by 5 ns. The entire accelerator is approximately 1 meter long and produces hard x-rays tunable over a wide range of photon energies. The colliding laser is a Yb:YAG solid-state amplifier producing 1030 nm, 100 mJ pulses at the same 1 kHz repetition rate as the accelerator. The laser pulse is frequency-doubled and stored for many passes in a ringdown cavity to match the linac pulse structure. At a photon energy of 12.4 keV, the predicted x-ray flux is $5 \times 10^{11}$ photons/second in a 5% bandwidth and the brilliance is $2 \times 10^{12}\ \mathrm{photons/(sec\ mm^2\ mrad^2\ 0.1\%)}$ in pulses with an RMS pulse length of 490 fs. The nominal electron beam parameters are 18 MeV kinetic energy, 10 microamp average current, and 0.5 microsecond macropulse length, resulting in an average electron beam power of 180 W. Optimization of the x-ray output is presented along with the design of the accelerator, laser, and x-ray optic components that are specific to the particular characteristics of the Compton-scattered x-ray pulses. Comment: 25 pages, 24 figures, 54 references
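
    As a back-of-envelope check (illustration only), the quoted bunch parameters reproduce the stated average current, effective collision rate, and average beam power:

```python
# Consistency check of the beam parameters quoted in the abstract.
bunch_charge = 100e-12        # C per bunch
bunches_per_train = 100
rep_rate = 1e3                # Hz, macropulse (train) repetition rate
kinetic_energy = 18e6         # eV

avg_current = bunch_charge * bunches_per_train * rep_rate    # 1e-5 A = 10 microamp
collision_rate = bunches_per_train * rep_rate                # 1e5 Hz = 100 kHz
beam_power = kinetic_energy * avg_current                    # 18e6 V * 1e-5 A = 180 W

print(avg_current, collision_rate, beam_power)               # 1e-05  100000.0  180.0
```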

    Atomistic Simulations of Nanotube Fracture

    The fracture of carbon nanotubes is studied by atomistic simulations. The fracture behavior is found to be almost independent of the separation energy and to depend primarily on the inflection point in the interatomic potential. The range of fracture strains compares well with experimental results, but the predicted range of fracture stresses is markedly higher than observed. Various plausible small-scale defects do not suffice to bring the failure stresses into agreement with available experimental results. As in the experiments, the fracture of carbon nanotubes is predicted to be brittle. The results show a moderate dependence of fracture strength on chirality. Comment: 12 pages, PDF, submitted to Phys. Rev.
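
    For intuition about the role of the inflection point: it is the separation at which the interatomic force peaks, so it sets the bond-breaking strain. The sketch below uses a Morse potential as a stand-in for the bond-order potentials actually used in such simulations; the parameter values are placeholders, not taken from the paper.

```python
# Hedged illustration: locate the inflection point (peak-force separation) of a
# Morse potential, used here only as a stand-in interatomic potential.
import numpy as np

D, a, r0 = 4.9, 1.9, 1.42        # eV, 1/Angstrom, Angstrom (placeholder C-C-like values)

r = np.linspace(1.0, 3.0, 20001)
V = D * (1.0 - np.exp(-a * (r - r0))) ** 2     # Morse potential
F = -np.gradient(V, r)                          # interatomic force

i_peak = np.argmin(F)                           # most negative force = peak attraction
print("numerical inflection point:", r[i_peak])
print("analytic  inflection point:", r0 + np.log(2) / a)   # where V'' = 0 for Morse
```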

    Developing reading-writing connections: the impact of explicit instruction of literary devices on the quality of children's narrative writing

    The purpose of this collaborative schools-university study was to investigate how the explicit instruction of literary devices during designated literacy sessions could improve the quality of children's narrative writing. A guiding question for the study was: Can children's writing be enhanced by teachers drawing attention to the literary devices used by professional writers or "mentor authors"? The study was conducted with 18 teachers, working as research partners in nine elementary schools over one school year. The research group explored ways of developing children as reflective authors, able to draft and redraft writing in response to peer and teacher feedback. Daily literacy sessions were complemented by weekly writing workshops where students engaged in authorial activity and experienced writers' perspectives and readers' demands (Harwayne, 1992; May, 2004). Methods for data collection included video recording of peer-peer and teacher-led group discussions and audio recording of teacher-child conferences. Samples of children's narrative writing were collected, and a comparison was made between the quality of their independent writing at the beginning and end of the research period. The research group documented the importance of peer-peer and teacher-student discourse in the development of children's metalanguage and awareness of audience. The study suggests that reading, discussing, and evaluating mentor texts can have a positive impact on the quality of children's independent writing.